Search results for "Regression problems"
Showing 3 of 3 documents
Some extensions of multivariate sliced inverse regression
2007
Multivariate sliced inverse regression (SIR) is a method for achieving dimension reduction in regression problems when the outcome variable y and the regressor x are both assumed to be multidimensional. In this paper, we extend the existing approaches, based on the usual SIR I which only uses the inverse regression curve, to methods using properties of the inverse conditional variance. Unlike the existing methods, these new ones are not blind to symmetric dependencies and rely on SIR II or SIRα. We also propose their corresponding pooled slicing versions. We illustrate the usefulness of these approaches on simulation studies.
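The basic SIR I estimator the abstract extends can be illustrated for a univariate response (a minimal sketch, not the authors' code; the function name and slicing scheme are my own simplifications):

```python
import numpy as np

def sir1_directions(x, y, n_slices=10, n_dirs=2):
    """SIR I sketch: estimate dimension-reduction directions from the
    slice means of the inverse regression curve E[x | y]."""
    n, p = x.shape
    mu = x.mean(axis=0)
    cov = np.cov(x, rowvar=False)
    # Whiten x via the inverse square root of its covariance
    evals, evecs = np.linalg.eigh(cov)
    cov_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    z = (x - mu) @ cov_inv_sqrt
    # Slice the sorted response and average z within each slice
    slices = np.array_split(np.argsort(y), n_slices)
    m = np.zeros((p, p))
    for idx in slices:
        zh = z[idx].mean(axis=0)
        m += (len(idx) / n) * np.outer(zh, zh)  # weighted slice-mean covariance
    # Leading eigenvectors of M, mapped back to the original scale
    _, v = np.linalg.eigh(m)
    return cov_inv_sqrt @ v[:, ::-1][:, :n_dirs]
```

SIR II, which the paper builds on, would instead exploit the conditional variance Cov(x | y) within each slice, making it sensitive to the symmetric dependencies SIR I misses.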
ELM Regularized Method for Classification Problems
2016
Extreme Learning Machine (ELM) is a recently proposed algorithm, efficient and fast for learning the parameters of single-layer neural structures. One of the main problems of this algorithm is choosing the optimal architecture for a given problem. To overcome this limitation, several solutions have been proposed in the literature, including regularization of the structure. However, to the best of our knowledge, there are no works where such an adjustment is applied to classification problems in the presence of a non-linearity in the output; all published works tackle modelling or regression problems. Our proposal has been applied to a series of standard databases for the evaluation o…
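The core ELM idea with ridge regularization can be sketched as follows (my own minimal version, assuming a tanh hidden layer and one-hot targets; not the paper's implementation):

```python
import numpy as np

def elm_ridge_fit(x, y_onehot, n_hidden=50, lam=1e-2, rng=None):
    """Regularized ELM sketch: random (untrained) hidden layer, then a
    ridge-regularized least-squares solve for the output weights."""
    if rng is None:
        rng = np.random.default_rng(0)
    p = x.shape[1]
    w_in = rng.normal(size=(p, n_hidden))  # random input weights, never trained
    b = rng.normal(size=n_hidden)
    h = np.tanh(x @ w_in + b)              # hidden-layer activations
    # beta = (H'H + lam I)^{-1} H' Y  -- the only "learning" step in ELM
    beta = np.linalg.solve(h.T @ h + lam * np.eye(n_hidden), h.T @ y_onehot)
    return w_in, b, beta

def elm_predict(x, w_in, b, beta):
    return np.tanh(x @ w_in + b) @ beta
```

The regularizer lam is what controls the effective architecture: larger values shrink the contribution of redundant hidden units, which is the kind of structural adjustment the abstract discusses.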
Generalized Multitarget Linear Regression with Output Dependence Estimation
2019
Multitarget regression has recently received attention in the context of modern, large-scale problems in which finding good enough solutions in a timely manner is crucial. Different proposed alternatives use a combination of regularizers that lead to different ways of solving the problem. In this work, we introduce a general formulation with several regularizers. This leads to a biconvex minimization problem, and we use an alternating procedure with accelerated proximal gradient steps to solve it. We show that our formulation is equivalent to, but more efficient than, some previously proposed approaches. Moreover, we introduce two new variants. The experimental validation carried out suggests th…
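A single proximal-gradient block of the kind of alternating scheme described can be sketched as plain ISTA on an L1-regularized multitarget least-squares problem (a simplified, non-accelerated stand-in for illustration; the paper's actual objective and regularizers differ):

```python
import numpy as np

def soft_threshold(w, t):
    """Proximal operator of t * ||.||_1, applied elementwise."""
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def multitarget_ista(x, y, lam=0.1, n_iter=200):
    """ISTA sketch for min_W 0.5 ||Y - XW||_F^2 + lam ||W||_1,
    one convex block of an alternating biconvex procedure."""
    p, k = x.shape[1], y.shape[1]
    w = np.zeros((p, k))
    step = 1.0 / np.linalg.norm(x, 2) ** 2  # 1 / Lipschitz constant of the gradient
    for _ in range(n_iter):
        grad = x.T @ (x @ w - y)            # gradient of the smooth term
        w = soft_threshold(w - step * grad, step * lam)
    return w
```

An accelerated (FISTA-style) variant would add a momentum extrapolation between iterations, and the alternating procedure would interleave such blocks for each group of variables while holding the others fixed.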